Metric Entropy and Minimax Risk in Classification

Authors

  • David Haussler
  • Manfred Opper
Abstract

We apply recent results on the minimax risk in density estimation to the related problem of pattern classification. The notion of loss we seek to minimize is an information-theoretic measure of how well we can predict the classification of future examples, given the classification of previously seen examples. We give an asymptotic characterization of the minimax risk in terms of the metric entropy properties of the class of distributions that might be generating the examples. We then use these results to characterize the minimax risk in the special case of noisy two-valued classification problems in terms of the Assouad density and the Vapnik-Chervonenkis dimension.
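The abstract above characterizes minimax risk for two-valued classification partly through the Vapnik-Chervonenkis dimension. As a hedged illustration (our own, not taken from the paper), the VC dimension of a small finite hypothesis class can be found by a brute-force shattering check; the interval classifiers and the grid of points below are illustrative choices:

```python
from itertools import combinations

def shatters(hypotheses, points):
    """True if the hypothesis class realizes all 2^n labelings of `points`."""
    labelings = {tuple(h(x) for x in points) for h in hypotheses}
    return len(labelings) == 2 ** len(points)

def vc_dimension(hypotheses, domain, max_d=5):
    """Largest d (up to max_d) such that some d-subset of `domain` is shattered."""
    d = 0
    for k in range(1, max_d + 1):
        if any(shatters(hypotheses, pts) for pts in combinations(domain, k)):
            d = k
    return d

# Indicator functions of intervals [a, b] on a small grid; this class
# has VC dimension 2 (any 2 points shatter, no 3 points do, since the
# labeling "in, out, in" is impossible for an interval).
grid = range(6)
intervals = [lambda x, a=a, b=b: a <= x <= b
             for a in grid for b in grid if a <= b]
print(vc_dimension(intervals, grid))  # prints 2
```

The exhaustive check is exponential in the number of points, so it only works for toy classes like this one; the paper's interest is in how this combinatorial quantity governs asymptotic minimax risk, not in computing it.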


Related works

Metric Entropy and Minimax Risk in Classification

We apply recent results on the minimax risk in density estimation to the related problem of pattern classification. The notion of loss we seek to minimize is an information-theoretic measure of how well we can predict the classification of future examples, given the classification of previously seen examples. We give an asymptotic characterization of the minimax risk in terms of the metric entropy p...


A general minimax result for relative entropy

Suppose Nature picks a probability measure P_θ on a complete separable metric space X at random from a measurable set P = {P_θ : θ ∈ Θ}. Then, without knowing θ, a statistician picks a measure Q on X. Finally, the statistician suffers a loss D(P_θ ‖ Q), the relative entropy between P_θ and Q. We show that the minimax and maximin values of this game are always equal, and there is always a minimax strategy in the closure of ...


Replicant Compression Coding in Besov Spaces

We present here a new proof of the theorem of Birman and Solomyak on the metric entropy of the unit ball of a Besov space B^s_{π,q} on a regular domain of R^d. The result is: if s − d(1/π − 1/p)_+ > 0, then the Kolmogorov metric entropy satisfies H(ε) ∼ ε^{−d/s}. This proof takes advantage of the representation of such spaces on wavelet-type bases and extends the result to more general spaces. The lower ...


Minimax nonparametric classification - Part I: Rates of convergence

This paper studies minimax aspects of nonparametric classification. We first study minimax estimation of the conditional probability of a class label, given the feature variable. This function, say f, is assumed to be in a general nonparametric class. We show the minimax rate of convergence under squared L2 loss is determined by the massiveness of the class as measured by metric entropy. The se...


Information-Theoretic Determination of Minimax Rates of Convergence

In this paper, we present some general results determining minimax bounds on statistical risk for density estimation based on certain information-theoretic considerations. These bounds depend only on metric entropy conditions and are used to identify the minimax rates of convergence.
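The rate results described above hinge on metric entropy, i.e. the logarithm of the covering number of the class. As a toy illustration (our own, not from the paper), the unit interval [0, 1] under the absolute-value metric needs ⌈1/(2ε)⌉ closed balls of radius ε to be covered, so its metric entropy grows like log(1/ε):

```python
import math

def covering_number(eps):
    """Minimal number of closed eps-balls (radius eps) covering [0, 1]:
    centers at eps, 3*eps, 5*eps, ... give the optimal cover."""
    return math.ceil(1.0 / (2.0 * eps))

for eps in (0.25, 0.1, 0.01):
    # Metric entropy H(eps) = log N(eps) grows like log(1/eps).
    print(eps, covering_number(eps), round(math.log(covering_number(eps)), 3))
```

For richer classes, such as the Besov balls or the distribution classes in the papers listed here, the entropy grows polynomially in 1/ε rather than logarithmically, and it is this growth rate that determines the minimax rate of convergence.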



Journal:

Volume   Issue 

Pages  -

Publication date: 1997